1.
Comput Methods Programs Biomed; 242: 107788, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37738838

ABSTRACT

BACKGROUND AND OBJECTIVE: Oral cancer is the sixth most common human cancer. Brush cytology with counting of Argyrophilic Nucleolar Organizer Regions (AgNORs) can support early detection of oral cancer, lowering patient mortality. However, the manual AgNOR counting still in use today is time-consuming, labor-intensive, and error-prone. Our work addresses these shortcomings with a convolutional neural network (CNN) based method that automatically segments individual nuclei and AgNORs in microscope slide images and counts the AgNORs within each nucleus.

METHODS: We systematically defined, trained, and tested 102 CNNs in the search for a high-performing solution, evaluating 51 network architectures that combine 17 encoders with 3 decoders, each trained with 2 loss functions. These CNNs were trained and evaluated on a new AgNOR-stained image dataset of epithelial cells from the oral mucosa, containing 1,171 images from 48 patients with ground truth annotated by specialists. Annotation was greatly facilitated by a semi-automatic procedure developed in our project. Overlapping nuclei, which tend to hide AgNORs and thus distort their true count, were discarded by an automatic solution also developed in our project. Besides the evaluation on the test dataset, the robustness of the best-performing model was assessed against the results produced by a group of human experts on a second dataset.

RESULTS: The best-performing CNN model on the test dataset was a DenseNet-169 + LinkNet trained with Focal Loss (DenseNet-169 as encoder, LinkNet as decoder). It obtained a Dice score of 0.90 and an intersection over union (IoU) of 0.84. The counting achieved precision and recall of 0.94 and 0.90 for nuclei, and 0.82 and 0.74 for AgNORs, respectively. On a set of 291 images from 6 new patients, our solution performed on par with human experts, obtaining an Intraclass Correlation Coefficient (ICC) of 0.91 for nuclei and 0.81 for AgNORs, with 95% confidence intervals of [0.89, 0.93] and [0.77, 0.84], respectively, and p-values < 0.001, confirming statistical significance. Our AgNOR-stained image dataset is the most diverse publicly available one in terms of number of patients and the first for oral cells.

CONCLUSIONS: CNN-based joint segmentation and quantification of nuclei and NORs in AgNOR-stained images achieves expert-level performance while being orders of magnitude faster than human experts. Our solution demonstrated strong agreement with the results produced by a group of specialists, highlighting its potential to accelerate diagnostic workflows. Our trained model, code, and dataset are available and can stimulate new research in early oral cancer detection.
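The abstract names a DenseNet-169 encoder paired with a LinkNet decoder trained with Focal Loss, and reports Dice and IoU as segmentation metrics. The authors state that their code is available; the sketch below is not that code but a minimal illustration of how such a model and its metrics could be assembled, assuming the segmentation_models_pytorch library. The class count, pretraining weights, and image size are assumptions, not details taken from the abstract.

```python
# Minimal sketch (not the authors' released code): DenseNet-169 encoder +
# LinkNet decoder with Focal Loss, plus Dice/IoU computation.
# Assumes segmentation_models_pytorch; class count and image size are guesses.
import torch
import segmentation_models_pytorch as smp

# Encoder/decoder combination named in the abstract; weights/classes are assumptions.
model = smp.Linknet(
    encoder_name="densenet169",   # DenseNet-169 backbone
    encoder_weights="imagenet",   # assumed pretraining
    in_channels=3,                # RGB microscope slide images
    classes=3,                    # e.g. background / nucleus / AgNOR (assumed)
)

# Focal Loss, as reported for the best-performing model.
loss_fn = smp.losses.FocalLoss(mode="multiclass")

def dice_and_iou(pred: torch.Tensor, target: torch.Tensor, eps: float = 1e-7):
    """Dice score and IoU for a pair of binary masks (bool or 0/1 tensors)."""
    pred = pred.float().flatten()
    target = target.float().flatten()
    intersection = (pred * target).sum()
    dice = (2 * intersection + eps) / (pred.sum() + target.sum() + eps)
    iou = (intersection + eps) / (pred.sum() + target.sum() - intersection + eps)
    return dice.item(), iou.item()

# Toy forward pass to show the expected shapes.
x = torch.randn(1, 3, 512, 512)
logits = model(x)                              # (1, 3, 512, 512)
pred_mask = logits.argmax(dim=1)               # predicted class per pixel
gt_mask = torch.randint(0, 3, (1, 512, 512))   # dummy ground truth
dice, iou = dice_and_iou(pred_mask == 1, gt_mask == 1)  # metrics for the nucleus class
```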


Subjects
Mouth Neoplasms; Nucleolus Organizer Region; Humans; Silver Staining/methods; Mouth Neoplasms/diagnostic imaging; Neural Networks, Computer